Revisiting the Collinear Data Problem: An Assessment of Estimator ‘Ill-Conditioning’ in Linear Regression

Authors

  • Karen Callaghan
  • Jie Chen
Abstract

Linear regression has gained widespread popularity in the social sciences. However, many applications of linear regression involve data that are collinear or ‘ill-conditioned.’ Collinearity yields regression estimates with inflated standard errors. In this paper, we present a method for precisely identifying coefficient estimates that are ill-conditioned, as well as those that are not involved, or only marginally involved, in a linear dependency. Diagnostic tools are presented for a hypothetical regression model estimated by ordinary least squares (OLS). It is hoped that practicing researchers will more readily incorporate these diagnostics into their analyses.
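
The article's own diagnostic tables are not reproduced on this page. As a rough illustration of the kind of collinearity diagnostics the abstract describes, the sketch below computes condition indices and variance-decomposition proportions, in the spirit of Belsley, Kuh, and Welsch, for a small simulated design matrix; the data and variable names are illustrative assumptions, not taken from the article.

```python
import numpy as np

def collinearity_diagnostics(X):
    """Condition indices and variance-decomposition proportions for a
    design matrix X (n x p), with columns scaled to unit length as in
    the Belsley-Kuh-Welsch diagnostics."""
    # Scale each column to unit length before the decomposition.
    Xs = X / np.linalg.norm(X, axis=0)
    # Singular value decomposition of the scaled design matrix.
    U, mu, Vt = np.linalg.svd(Xs, full_matrices=False)
    V = Vt.T
    # Condition indices: largest singular value over each singular value.
    cond_idx = mu.max() / mu
    # phi[k, j] = V[k, j]^2 / mu[j]^2; normalised across j, the proportions
    # show how much of var(beta_k) is tied to each near-dependency.
    phi = (V ** 2) / (mu ** 2)
    props = phi / phi.sum(axis=1, keepdims=True)
    return cond_idx, props

# Illustrative data: x3 is nearly a linear combination of x1 and x2.
rng = np.random.default_rng(0)
n = 100
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
x3 = x1 + x2 + rng.normal(scale=0.01, size=n)   # near-dependency
X = np.column_stack([np.ones(n), x1, x2, x3])   # include intercept

cond_idx, props = collinearity_diagnostics(X)
print("condition indices:", np.round(cond_idx, 1))
print("variance-decomposition proportions:\n", np.round(props, 3))
```

Read in the usual way: a large condition index (conventionally above about 30) combined with two or more coefficients whose variance proportions on that index exceed 0.5 flags the coefficients caught in the same near-dependency, while coefficients with small proportions on every large index are at most marginally involved.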


Similar articles


Kernel Ridge Estimator for the Partially Linear Model under Right-Censored Data

Objective: This paper aims to introduce a modified kernel-type ridge estimator for partially linear models under randomly right-censored data. Such models involve two main issues that need to be addressed: multicollinearity and censoring. To address these issues, we improve the kernel estimator based on a synthetic-data transformation and kNN imputation techniques. The key idea of this paper is t...
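
The kernel, censoring, and imputation machinery described above is not reproduced here. As context for why ridge-type shrinkage helps with multicollinearity, here is a minimal sketch of the ordinary ridge estimator that such methods build on, with an illustrative penalty and simulated data; the synthetic-data transformation and kNN imputation steps are omitted.

```python
import numpy as np

def ridge_estimator(X, y, lam):
    """Ordinary ridge estimator: (X'X + lam*I)^(-1) X'y.
    The penalty lam stabilises X'X when the columns of X are
    nearly collinear (ill-conditioned)."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

# Illustrative collinear design (not from the article).
rng = np.random.default_rng(1)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.05, size=n)          # nearly collinear with x1
X = np.column_stack([np.ones(n), x1, x2])
y = 1.0 + 2.0 * x1 - 1.0 * x2 + rng.normal(size=n)

beta_ols = np.linalg.solve(X.T @ X, X.T @ y)      # unstable under collinearity
beta_ridge = ridge_estimator(X, y, lam=1.0)       # shrunk but more stable
print(beta_ols, beta_ridge)
```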


Minimum-Variance Pseudo-Unbiased Reduced-Rank Estimator (MV-PURE) and Its Applications to Ill-Conditioned Inverse Problems

This paper presents a mathematically novel estimator for the linear regression model, named the Minimum-Variance Pseudo-Unbiased Reduced-Rank Estimator (MV-PURE), designed specifically for applications where the model matrix under consideration is ill-conditioned and auxiliary knowledge of the unknown deterministic parameter vector is available in the form of linear constraints. We demonstrate closed al...
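
MV-PURE itself is not shown here; its closed-form expressions lie beyond this truncated entry. As a plain illustration of the rank-reduction ingredient applied to an ill-conditioned model matrix, the sketch below uses truncated-SVD (reduced-rank) least squares, which is not the MV-PURE estimator and ignores the linear constraints; the data are simulated for illustration.

```python
import numpy as np

def reduced_rank_ls(X, y, rank):
    """Truncated-SVD (reduced-rank) least squares: keep only the 'rank'
    largest singular values when inverting X, discarding the directions
    responsible for the ill-conditioning."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    k = rank
    return Vt[:k].T @ ((U[:, :k].T @ y) / s[:k])

# Illustrative ill-conditioned design (not from the article).
rng = np.random.default_rng(4)
n = 120
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=1e-3, size=n)          # nearly dependent on x1
X = np.column_stack([np.ones(n), x1, x2])
y = 1.0 + x1 + x2 + rng.normal(size=n)

print("full-rank LS:   ", np.linalg.lstsq(X, y, rcond=None)[0])
print("reduced-rank LS:", reduced_rank_ls(X, y, rank=2))
```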


Positive-Shrinkage and Pretest Estimation in Multiple Regression: A Monte Carlo Study with Applications

Consider the problem of predicting a response variable using a set of covariates in a linear regression model. If it is known or suspected a priori that a subset of the covariates does not contribute significantly to the overall fit of the model, a restricted model that excludes these covariates may be sufficient. If, on the other hand, the subset provides useful information, shrinkage meth...
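
As a rough sketch of the pretest idea (the paper's shrinkage variants and Monte Carlo design are not reproduced), the code below fits the full and restricted OLS models, applies the usual F-test to the suspect subset, and keeps the restricted fit only when the test fails to reject; the data, tested subset, and significance level are illustrative assumptions.

```python
import numpy as np
from scipy import stats

def ols(X, y):
    """Least-squares fit and residual sum of squares."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    resid = y - X @ beta
    return beta, resid @ resid

def pretest_estimator(X, y, test_cols, alpha=0.05):
    """Pretest estimator: use the restricted OLS fit (suspect coefficients
    set to zero) unless an F-test rejects the restriction, in which case
    use the full OLS fit."""
    n, p = X.shape
    q = len(test_cols)
    keep_cols = [j for j in range(p) if j not in test_cols]

    beta_full, rss_full = ols(X, y)
    beta_restr_small, rss_restr = ols(X[:, keep_cols], y)

    # Embed the restricted estimate back into a length-p vector.
    beta_restr = np.zeros(p)
    beta_restr[keep_cols] = beta_restr_small

    # Standard F statistic for H0: the tested coefficients are zero.
    f_stat = ((rss_restr - rss_full) / q) / (rss_full / (n - p))
    f_crit = stats.f.ppf(1 - alpha, q, n - p)
    return beta_full if f_stat > f_crit else beta_restr

# Illustrative use: suspect that columns 2 and 3 add nothing.
rng = np.random.default_rng(2)
n = 150
X = np.column_stack([np.ones(n), rng.normal(size=(n, 3))])
y = 0.5 + 1.5 * X[:, 1] + rng.normal(size=n)      # columns 2, 3 truly inert
print(pretest_estimator(X, y, test_cols=[2, 3]))
```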


Robust Estimation in Linear Regression Model: the Density Power Divergence Approach

The minimum density power divergence method provides robust estimates when the dataset contains a number of outliers. In this study, we introduce and use a robust minimum density power divergence estimator to estimate the parameters of the linear regression model, and then, with some numerical examples of the linear regression model, we show the robustness of this est...
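
As a very rough sketch of the minimum density power divergence idea for a normal-error linear regression (this is not the paper's own implementation or examples), the code below minimises the standard density power divergence objective of Basu et al. numerically; the tuning constant, starting values, and simulated outlier-contaminated data are assumptions made for illustration.

```python
import numpy as np
from scipy.optimize import minimize

def dpd_objective(params, X, y, alpha):
    """Density power divergence objective for a normal-error linear
    regression; alpha > 0 trades efficiency for robustness, and
    alpha -> 0 approaches maximum likelihood."""
    beta, log_sigma = params[:-1], params[-1]
    sigma2 = np.exp(2.0 * log_sigma)
    r = y - X @ beta
    c = (2.0 * np.pi * sigma2) ** (-alpha / 2.0)
    # Integral of the model density raised to (1 + alpha).
    integral_term = c / np.sqrt(1.0 + alpha)
    # Downweighted contribution of each observation.
    data_term = (1.0 + 1.0 / alpha) * c * np.exp(-alpha * r**2 / (2.0 * sigma2))
    return np.mean(integral_term - data_term)

# Illustrative data with a few gross outliers (not from the article).
rng = np.random.default_rng(3)
n = 200
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])
y = 1.0 + 2.0 * x + rng.normal(scale=0.5, size=n)
y[:10] += 15.0                                     # contaminate with outliers

start = np.zeros(3)                                # [beta0, beta1, log_sigma]
fit = minimize(dpd_objective, start, args=(X, y, 0.5), method="Nelder-Mead")
print("robust DPD estimate of the coefficients:", np.round(fit.x[:2], 2))
```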



Journal: Practical Assessment, Research & Evaluation

Publication date: 2008